convert a managed table on Databricks to an external table

Sometimes, when trying to convert a managed table, I ran into AnalysisException: Cannot set or change the preserved property key: 'EXTERNAL'. The workaround below changes the table type through the session catalog instead of table properties.
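For context, the obvious route is setting the table property directly. A sketch of what that looks like (database and table names are placeholders) and why it fails:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().getOrCreate()

// Spark treats 'EXTERNAL' as a reserved/preserved property key,
// so this raises the AnalysisException mentioned above.
spark.sql("ALTER TABLE database.table_name SET TBLPROPERTIES ('EXTERNAL' = 'TRUE')")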

import org.apache.spark.sql.catalyst.TableIdentifier
import org.apache.spark.sql.catalyst.catalog.{CatalogTable, CatalogTableType}

// Look up the table's current metadata in the session catalog.
val identifier = TableIdentifier("table_name", Some("database"))
val oldTable = spark.sessionState.catalog.getTableMetadata(identifier)

// Copy the metadata with the table type switched to EXTERNAL.
val alteredTable = oldTable.copy(tableType = CatalogTableType.EXTERNAL)

// Write the modified metadata back to the catalog.
spark.sessionState.catalog.alterTable(alteredTable)

This will do the trick: the table is now external, so dropping it no longer deletes the underlying data files.
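As a sanity check (assuming the same session and placeholder names as above), you can re-read the metadata and confirm the type actually changed:

// Re-read the catalog entry; tableType should now report EXTERNAL.
val refreshed = spark.sessionState.catalog.getTableMetadata(identifier)
assert(refreshed.tableType == CatalogTableType.EXTERNAL)

// DESCRIBE EXTENDED also reports the table type in its output.
spark.sql("DESCRIBE EXTENDED database.table_name").show(truncate = false)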
